An eating disorders chatbot offered dieting advice, raising fears about AI in health
A few weeks ago, Sharon Maxwell heard that the National Eating Disorders Association (NEDA) was shutting down its long-running national helpline and promoting a chatbot called Tessa as "a meaningful prevention resource" for those struggling with eating disorders. She decided to try out the chatbot herself.
Maxwell, who is based in San Diego, had struggled for years with an eating disorder that began in childhood. She now works as a consultant in the eating disorder field. "Hi, Tessa," she typed into the online text box. "How do you support folks with eating disorders?"
Tessa rattled off a list of ideas, including some resources for "healthy eating habits." Alarm bells immediately went off in Maxwell's head. She asked Tessa for more details. Before long, the chatbot was giving her tips on losing weight - ones that sounded an awful lot like what she'd been told when she was put on Weight Watchers at age 10.
"The recommendations that Tessa gave me was that I could lose 1 to 2 pounds per week, that I should eat no more than 2,000 calories in a day, that I should have a calorie deficit of 500-1,000 calories per day," Maxwell says. "All of which might sound benign to the general listener. However, to an individual with an eating disorder, the focus of weight loss really fuels the eating disorder."
Maxwell shared her concerns on social media, helping launch an online controversy which led NEDA to announce on May 30 that it was indefinitely disabling Tessa. Patients, families, doctors and other experts on eating disorders were left stunned and bewildered about how a chatbot designed to help people with eating disorders could end up dispensing diet tips instead.
The uproar has also set off a fresh wave of debate as companies turn to artificial intelligence (AI) as a possible solution to a surging mental health crisis and severe shortage of clinical treatment providers.
A chatbot suddenly in the spotlight
NEDA had already come under scrutiny after NPR reported on May 24 that the national nonprofit advocacy group was shutting down its helpline after more than 20 years of operation.
CEO Liz Thompson informed helpline volunteers of the decision in a March 31 email, saying NEDA would "begin to pivot to the expanded use of AI-assisted technology to provide individuals and families with a moderated, fully automated resource, Tessa."
"We see the changes from the Helpline to Tessa and our expanded website as part of an evolution, not a revolution, respectful of the ever-changing landscape in which we operate."
(Thompson followed up with a statement on June 7, saying that in NEDA's "attempt to share important news about separate decisions regarding our Information and Referral Helpline and Tessa, that the two separate decisions may have become conflated which caused confusion. It was not our intention to suggest that Tessa could provide the same type of human connection that the Helpline offered.")
On May 30, less than 24 hours after Maxwell provided NEDA with screenshots of her troubling conversation with Tessa, the non-profit announced it had "taken down" the chatbot "until further notice."
NEDA says it didn't know chatbot could create new responses
NEDA blamed the chatbot's problems on Cass, a mental health chatbot company that operated Tessa as a free service. According to CEO Thompson, Cass had changed Tessa without NEDA's awareness or approval, enabling the chatbot to generate new answers beyond what Tessa's creators had intended.
"By design it, it couldn't go off the rails," says Ellen Fitzsimmons-Craft, a clinical psychologist and professor at Washington University Medical School in St. Louis. Craft helped lead the team that first built Tessa with funding from NEDA.
The version of Tessa that they tested and studied was a rule-based chatbot, meaning it could only use a limited number of prewritten responses. "We were very cognizant of the fact that A.I. isn't ready for this population," she says. "And so all of the responses were pre-programmed."
The founder and CEO of Cass, Michiel Rauws, told NPR the changes to Tessa were made last year as part of a "systems upgrade," including an "enhanced question and answer feature." That feature uses generative Artificial Intelligence, meaning it gives the chatbot the ability to use new data and create new responses.
That change was part of NEDA's contract, Rauws says.
But NEDA's CEO Liz Thompson told NPR in an email that "NEDA was never advised of these changes and did not and would not have approved them."
"The content some testers received relative to diet culture and weight management can be harmful to those with eating disorders, is against NEDA policy, and would never have been scripted into the chatbot by eating disorders experts, Drs. Barr Taylor and Ellen Fitzsimmons Craft," she wrote.
Complaints about Tessa started last year
NEDA was already aware of some issues with the chatbot months before Sharon Maxwell publicized her interactions with Tessa in late May.
In October 2022, NEDA passed along screenshots from Monika Ostroff, executive director of the Multi-Service Eating Disorders Association (MEDA) in Massachusetts.
They showed Tessa telling Ostroff to avoid "unhealthy" foods and only eat "healthy" snacks, like fruit. "It's really important that you find what healthy snacks you like the most, so if it's not a fruit, try something else!" Tessa told Ostroff. "So the next time you're hungry between meals, try to go for that instead of an unhealthy snack like a bag of chips. Think you can do that?"
In a recent interview, Ostroff says this was a clear example of the chatbot encouraging "diet culture" mentality. "That meant that they [NEDA] either wrote these scripts themselves, they got the chatbot and didn't bother to make sure it was safe and didn't test it, or released it and didn't test it," she says.
The healthy snack language was quickly removed after Ostroff reported it. But Rauws says that problematic language was part of Tessa's "pre-scripted language, and not related to generative AI."
Fitzsimmons-Craft denies her team wrote that. "[That] was not something our team designed Tessa to offer and... it was not part of the rule-based program we originally designed."
Then, earlier this year, Rauws says "a similar event happened as another example."
"This time it was around our enhanced question and answer feature, which leverages a generative model. When we got notified by NEDA that an answer text [Tessa] provided fell outside their guidelines, and it was addressed right away."
Rauws says he can't provide more details about what this event entailed.
"This is another earlier instance, and not the same instance as over the Memorial Day weekend," he said in an email, referring to Maxwell's screenshots. "According to our privacy policy, this is related to user data tied to a question posed by a person, so we would have to get approval from that individual first."
When asked about this event, Thompson says she doesn't know what instance Rauws is referring to.
Despite their disagreements over what happened and when, both NEDA and Cass have issued apologies.
Ostroff says regardless of what went wrong, the impact on someone with an eating disorder is the same. "It doesn't matter if it's rule-based [AI] or generative, it's all fat-phobic," she says. "We have huge populations of people who are harmed by this kind of language every day."
She also worries about what this might mean for the tens of thousands of people who were turning to NEDA's helpline each year.
"Between NEDA taking their helpline offline, and their disastrous chatbot....what are you doing with all those people?"
Thompson says NEDA is still offering numerous resources for people seeking help, including a screening tool and resource map, and is developing new online and in-person programs.
"We recognize and regret that certain decisions taken by NEDA have disappointed members of the eating disorders community," she said in an emailed statement. "Like all other organizations focused on eating disorders, NEDA's resources are limited and this requires us to make difficult choices... We always wish we could do more and we remain dedicated to doing better."